Search engine crawlers are programs or scripts that browse web pages automatically, following a predefined strategy. These crawlers are popularly known as spiders, robots, or bots. Search engines use crawlers to read web pages and store a list of the words they contain along with each word’s location (Baker, 2005). When someone searches on a search engine, for instance Google, the person is actually searching Google’s index rather than the live web.
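The crawl-and-index process described above can be sketched as follows. This is a minimal illustration, not a production crawler: the "web" is a hypothetical in-memory dictionary of URL to (text, links) pairs standing in for real HTTP fetches, and a real crawler would additionally download pages, parse HTML, and respect robots.txt.

```python
from collections import defaultdict

# Hypothetical stand-in for the web: URL -> (page text, outgoing links).
PAGES = {
    "http://a.example": ("crawlers browse web pages", ["http://b.example"]),
    "http://b.example": ("search engines index web pages", []),
}

def crawl_and_index(seed):
    """Crawl from a seed URL, building an index of word -> {url: [positions]}."""
    index = defaultdict(dict)
    queue, seen = [seed], set()
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text, links = PAGES[url]
        # Record each word along with its position on the page.
        for pos, word in enumerate(text.split()):
            index[word].setdefault(url, []).append(pos)
        # Follow links to discover further pages.
        queue.extend(links)
    return index

index = crawl_and_index("http://a.example")
# A query consults the stored index, not the pages themselves:
print(sorted(index["web"]))  # both crawled pages mention "web"
```

A search for "web" here returns both URLs from the index directly, mirroring how a query to Google is answered from its index rather than by re-reading the pages.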